
    Contributions towards understanding and building sustainable science

    This dissertation focuses either on understanding and detecting threats to the epistemology of science (Chapters 1-6) or on making practical advances to remedy epistemological threats (Chapters 7-9). Chapter 1 reviews the literature on responsible conduct of research, questionable research practices, and research misconduct. Chapter 2 reanalyzes the claims of Head et al. (2015) about widespread p-hacking for robustness. Chapter 3 examines 258,050 test results across 30,710 articles from eight high-impact journals to investigate whether p-values just below .05 are peculiarly prevalent (i.e., a bump) in the psychological literature, and whether that prevalence has increased over time. Chapter 4 examines evidence for false negatives in nonsignificant results throughout psychology, in gender effects, and in the Reproducibility Project: Psychology. Chapter 5 describes a dataset that is the result of content mining 167,318 published articles for statistical test results reported according to the standards prescribed by the American Psychological Association (APA). In Chapter 6, I test the validity of statistical methods to detect fabricated data in two studies. Chapter 7 tackles the issue of data extraction from figures in scholarly publications. In Chapter 8, I argue that "after-the-fact" research papers do not help alleviate issues of access, selective publication, and reproducibility, but actually cause some of these threats because the chronology of the research cycle is lost in a research paper. I propose giving up the academic paper in favor of a digitally native "as-you-go" alternative. In Chapter 9, I propose a technical design for this alternative.

    688,112 statistical results: Content mining psychology articles for statistical test results

    In this data deposit, I describe a dataset that is the result of content mining 167,318 published articles for statistical test results reported according to the standards prescribed by the American Psychological Association (APA). Articles published by the APA, Springer, Sage, and Taylor & Francis were included (mining from Wiley and Elsevier was actively blocked). As a result of this content mining, 688,112 results from 50,845 articles were extracted. To provide a comprehensive dataset, the statistical results are supplemented with metadata from the articles they originate from. The dataset is provided as a comma-separated values (CSV) file in long format. For each of the 688,112 results, 20 variables are included, of which seven are article metadata and thirteen pertain to the individual statistical result (e.g., the reported and recalculated p-value). A five-pronged approach was taken to generate the dataset: (i) collect journal lists; (ii) spider journal pages for articles; (iii) download articles; (iv) add article metadata; and (v) mine articles for statistical results. All materials, scripts, etc. are available at https://github.com/chartgerink/2016statcheck_data and preserved at http://dx.doi.org/10.5281/zenodo.59818
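    As an illustration of how such a long-format CSV could be consumed, the sketch below parses a mock excerpt and recomputes p-values from the reported test statistics, in the spirit of statcheck. The column names are hypothetical assumptions (the deposit documents the actual 20 variables), and the recalculation is restricted to Z tests for brevity, whereas the real dataset also covers t, F, chi-squared, and r tests.

```python
import csv
import io
import math

# Mock excerpt mimicking the long format: article metadata columns
# alongside per-result columns. Column names are illustrative
# assumptions, not the dataset's actual variable names.
mock_csv = """doi,journal,test_type,test_value,reported_p
10.1000/a,J1,Z,2.10,0.036
10.1000/a,J1,Z,1.64,0.04
10.1000/b,J2,Z,2.58,0.010
"""

def p_from_z(z):
    # Two-sided p-value for a standard-normal test statistic.
    return math.erfc(abs(z) / math.sqrt(2))

# Flag results whose reported p-value deviates notably from the
# recalculated one (a crude check; statcheck is more careful,
# e.g. about rounding and one-sided tests).
flags = []
for row in csv.DictReader(io.StringIO(mock_csv)):
    recalculated = p_from_z(float(row["test_value"]))
    inconsistent = abs(recalculated - float(row["reported_p"])) > 0.01
    flags.append(inconsistent)
    print(row["doi"], row["reported_p"], round(recalculated, 3), inconsistent)
```

    In the mock data, only the second row is flagged: a Z of 1.64 corresponds to a two-sided p of about .10, not the reported .04.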

    “As-You-Go” instead of “After-the-Fact”: A network approach to scholarly communication and evaluation

    Scholarly research faces threats to its sustainability in multiple domains (access, incentives, reproducibility, inclusivity). We argue that “after-the-fact” research papers do not help alleviate these threats and actually cause some of them, because the chronology of the research cycle is lost in a research paper. We propose giving up the academic paper in favor of a digitally native “as-you-go” alternative. In this design, modules of research outputs are communicated along the way and are directly linked to each other to form a network of outputs that can facilitate research evaluation. This embeds chronology in the design of scholarly communication and facilitates the recognition of more diverse outputs that go beyond the paper (e.g., code, materials). Moreover, using network analysis to investigate the relations between linked outputs could help align evaluation tools with evaluation questions. We illustrate how such a modular “as-you-go” design of scholarly communication could be structured and how network indicators could be computed to assist in the evaluation process, with specific use cases for funders, universities, and individual researchers.
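    As a rough illustration of how a network indicator might be computed over linked outputs, the stdlib-only sketch below models a hypothetical module network where each edge points from a later module to an earlier module it builds on, so chronology is embedded in the link direction. The module names and the chosen indicator (downstream reuse) are illustrative assumptions, not the paper's specification.

```python
from collections import deque

# Hypothetical "as-you-go" module network: edges point from a new
# output to the earlier outputs it builds on.
builds_on = {
    "theory": [],
    "preregistration": ["theory"],
    "dataset": ["preregistration"],
    "code": ["dataset"],
    "report": ["code", "preregistration"],
}

def downstream_reuse(module):
    """Count how many later modules build, directly or indirectly,
    on `module` -- one possible evaluation indicator."""
    # Invert the edges so we can walk forward in time.
    reverse = {m: [] for m in builds_on}
    for m, deps in builds_on.items():
        for d in deps:
            reverse[d].append(m)
    # Breadth-first traversal over everything reachable downstream.
    seen, queue = set(), deque(reverse[module])
    while queue:
        m = queue.popleft()
        if m not in seen:
            seen.add(m)
            queue.extend(reverse[m])
    return len(seen)

print({m: downstream_reuse(m) for m in builds_on})
```

    Under these assumptions an early module such as the theory accrues the most downstream reuse, which is one way a funder or university could weight foundational contributions that a paper-centric count would miss.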

    Synthesis and Alcoholysis of α-Alkylated Cyclopentane and Cyclohexane Fused Succinic Racemic Anhydrides in the Presence of Chiral Bases

    Bicyclic succinic anhydrides alkylated at the α-position have been prepared and submitted to alcoholysis in the presence of alkaloid bases. Anhydrides with a fused cyclopentane ring open only from the less hindered side, generating monoesters of >80% ee, whereas cyclohexane-fused anhydrides undergo parallel kinetic resolution, producing both regioisomeric monoesters.

    686,220 statistical results: Content mining psychology articles for statistical test results

    *** Please note: this dataset has been replaced by a new version; see relations for the update *** A dataset of 686,220 statistical test results reported according to the standards prescribed by the American Psychological Association (APA), mined from 50,740 articles out of the 276,669 published by the APA, Springer, Sage, and Taylor & Francis. Mining from Wiley and Elsevier was actively blocked. Metadata for each article are included. All scripts, etc. are available at github.com/chartgerink/2016statcheck_dat

    688,112 statistical results: Content mining psychology articles for statistical test results

    A dataset of 688,112 statistical test results reported according to the standards prescribed by the American Psychological Association (APA), mined from 50,845 articles out of the 167,318 published by the APA, Springer, Sage, and Taylor & Francis. Mining from Wiley and Elsevier was actively blocked. Metadata for each article are included. All materials (journal lists, scripts, etc.) are available at https://github.com/chartgerink/2016statcheck_data and preserved at http://dx.doi.org/10.5281/zenodo.5981

    Research practices and assessment of research misconduct

    This article discusses the responsible conduct of research, questionable research practices, and research misconduct. Responsible conduct of research is often defined in terms of a set of abstract, normative principles, professional standards, and ethics in doing research. To accommodate the normative principles of scientific research, the professional standards, and a researcher’s moral principles, transparent research practices can serve as a framework for responsible conduct of research. We suggest a “prune-and-add” project structure to enhance transparency and, by extension, responsible conduct of research. Questionable research practices are defined as practices that are detrimental to the research process. The prevalence of questionable research practices remains largely unknown, and reproducibility of findings has been shown to be problematic. Transparent practices discourage questionable ones, because deviations from responsible conduct become more apparent to scientific peers. Most effective might be preregistration of the research design, hypotheses, and analyses, which reduces particularism of results by providing an a priori research scheme. Research misconduct is defined as fabrication, falsification, and plagiarism (FFP) and is clearly the worst type of research practice. Although it is unambiguously wrong, it can be approached from both a scientific and a legal perspective. The legal perspective sees research misconduct as a form of white-collar crime. The scientific perspective seeks to answer the question: “Were results invalidated because of the misconduct?” We review how misconduct is typically detected, how its detection can be improved, and how prevalent it might be. Institutions could facilitate detection of data fabrication and falsification by implementing data auditing. Nonetheless, the effect of misconduct is pervasive: many retracted articles are still cited after the retraction has been issued.
Main points: (i) Researchers systematically evaluate their own conduct as more responsible than that of colleagues, but not as responsible as they would like. (ii) Transparent practices, facilitated by the Open Science Framework, help embody scientific norms that promote responsible conduct. (iii) Questionable research practices harm the research process and run counter to generally accepted scientific norms, but are hard to detect. (iv) Research misconduct requires active scrutiny by the research community, because editors and peer reviewers do not pay adequate attention to detecting it. (v) Tips are given on how to improve detection of potential problems.
